Learning and Evaluating Possibilistic Decision Trees using Information Affinity

Authors

  • Ilyes Jenhani
  • Salem Benferhat
  • Zied Elouedi
Abstract

This paper investigates the problem of building decision trees from data with imprecise class values, where the imprecision is encoded in the form of possibility distributions. The Information Affinity similarity measure is introduced into the well-known gain ratio criterion in order to assess the homogeneity of a set of possibility distributions representing the classes of the instances belonging to a given training partition. For the experimental study, we propose an Information Affinity-based performance criterion and use it to show the performance of the approach on well-known benchmarks.

Keywords—Data mining from uncertain data, Decision Trees, Possibility Theory.
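As a rough illustration of the core ingredient, the sketch below computes the Information Affinity between two possibility distributions defined over the same set of class labels, using the commonly cited definition Aff(π1, π2) = 1 − (d(π1, π2) + Inc(π1, π2)) / 2, where d is the normalized Manhattan distance and Inc the degree of inconsistency. The function name and the example distributions are illustrative assumptions; the paper's exact integration of this measure into the gain ratio criterion is not reproduced here.

```python
# Illustrative sketch (not the paper's exact formulation): Information Affinity
# between two possibility distributions defined over the same set of class labels.
# Assumes the commonly cited definition
#   Aff(pi1, pi2) = 1 - (d(pi1, pi2) + Inc(pi1, pi2)) / 2
# where d is the normalized Manhattan distance and Inc the inconsistency degree.

def information_affinity(pi1, pi2):
    """Similarity in [0, 1] between two possibility distributions (equal-length lists)."""
    assert len(pi1) == len(pi2) and len(pi1) > 0
    # Normalized Manhattan distance between the two distributions.
    d = sum(abs(a - b) for a, b in zip(pi1, pi2)) / len(pi1)
    # Inconsistency: 1 minus the height of the min-based conjunction.
    inc = 1.0 - max(min(a, b) for a, b in zip(pi1, pi2))
    return 1.0 - (d + inc) / 2.0


if __name__ == "__main__":
    # Two instances whose class labels are only known imprecisely,
    # encoded as possibility distributions over three classes.
    pi_a = [1.0, 0.4, 0.0]
    pi_b = [1.0, 0.2, 0.1]
    print(information_affinity(pi_a, pi_b))             # close to 1: similar labelings
    print(information_affinity(pi_a, [0.0, 0.3, 1.0]))  # lower: conflicting labelings
```

A homogeneity-style criterion could then aggregate such scores over the distributions of a candidate partition, in the spirit of the gain ratio adaptation described in the abstract.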

Related articles

On the Complexity of Decision Making in Possibilistic Decision Trees

When the information about uncertainty cannot be quantified in a simple, probabilistic way, possibilistic decision theory is often a natural framework to consider. Its development has led to a series of possibilistic criteria, e.g., pessimistic possibilistic qualitative utility, possibilistic likely dominance, binary possibilistic utility and possibilisti...
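As context for the criteria named in this abstract, the following sketch implements the two classical possibilistic qualitative utilities of Dubois and Prade, assuming possibility degrees and utilities are expressed on the same [0, 1] scale; it is not taken from the cited paper, and the variable names are illustrative.

```python
# Illustrative sketch: classical possibilistic qualitative utilities,
# assuming possibility degrees and utilities share the same [0, 1] scale.

def pessimistic_utility(pi, u):
    """U_pes = min over states of max(1 - possibility, utility)."""
    return min(max(1.0 - p, v) for p, v in zip(pi, u))

def optimistic_utility(pi, u):
    """U_opt = max over states of min(possibility, utility)."""
    return max(min(p, v) for p, v in zip(pi, u))

# Example: a decision whose bad outcome is only weakly possible still scores
# reasonably well under the pessimistic criterion.
pi = [1.0, 0.2]   # possibility of each state
u  = [0.9, 0.1]   # utility of each state
print(pessimistic_utility(pi, u))  # min(max(0, 0.9), max(0.8, 0.1)) = 0.8
print(optimistic_utility(pi, u))   # max(min(1, 0.9), min(0.2, 0.1)) = 0.9
```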

Necessity-Based Choquet Integrals for Sequential Decision Making under Uncertainty

Possibilistic decision theory is a natural framework to consider when information about uncertainty cannot be quantified in a probabilistic way. Different qualitative criteria based on possibility theory have been proposed, the definition of which requires a finite, ordinal, non-compensatory scale for evaluating both utility and plausibility. In the presence of heterogeneous information, i.e., when the know...
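For readers unfamiliar with the construction, here is a minimal, textbook-style sketch of a discrete Choquet integral taken with respect to a necessity measure induced by a possibility distribution; it assumes numeric utilities and possibility degrees and does not reproduce the heterogeneous-information setting of the cited paper.

```python
# Illustrative sketch: discrete Choquet integral with respect to the necessity
# measure induced by a possibility distribution (generic formulation, not the
# cited paper's construction).

def necessity(pi, subset):
    """N(A) = 1 - max possibility outside A (1 if the complement is empty)."""
    outside = [p for i, p in enumerate(pi) if i not in subset]
    return 1.0 - max(outside) if outside else 1.0

def choquet_necessity(pi, u):
    """Discrete Choquet integral of utilities u w.r.t. the necessity measure of pi."""
    order = sorted(range(len(u)), key=lambda i: u[i])  # states by increasing utility
    total, prev = 0.0, 0.0
    for k, i in enumerate(order):
        subset = set(order[k:])                 # states with utility >= u[i]
        total += (u[i] - prev) * necessity(pi, subset)
        prev = u[i]
    return total

pi = [1.0, 0.5, 0.2]   # possibility of each state
u  = [0.3, 0.7, 1.0]   # utility of each state
print(choquet_necessity(pi, u))  # 0.3: pessimistic, the most plausible state has low utility
```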

Possibilistic Networks with Local Structure

A recent topic in probabilistic network learning is to exploit local network structure, i.e. to capture regularities in the conditional probability distributions, and to learn networks with local structure from data. In this paper we apply this idea to possibilistic networks, i.e. we try to capture regularities in conditional possibility distributions, and present a modification of the learning...

Concepts for Probabilistic and Possibilistic Induction of Decision Trees on Real World Data

The induction of decision trees from data is a well-known method for learning classifiers. The success of this method depends to a high degree on the measure used to select the next attribute, which, if tested, will improve the accuracy of the classification. This paper examines some possibility-based selection measures and compares them to probability- and information-based measures on real worl...

Entropy evaluation based on confidence intervals of frequency estimates: Application to the learning of decision trees

Entropy gain is widely used for learning decision trees. However, as we go deeper down the tree, the examples become rarer and the faithfulness of entropy decreases. Thus, misleading choices and over-fitting may occur, and the tree has to be adjusted by using an early-stop criterion or post-pruning algorithms. However, these methods still depend on the choices previously made, which may be ...
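For reference, the sketch below shows the standard probabilistic entropy gain that this line of work starts from; the confidence-interval correction proposed in the cited paper is not reproduced, and the toy labels are made up for illustration.

```python
# Illustrative sketch of the standard (probabilistic) entropy gain used to grow
# decision trees; the confidence-interval correction of the cited paper is omitted.
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def entropy_gain(parent_labels, child_partitions):
    """Entropy of the parent node minus the weighted entropy of its children."""
    n = len(parent_labels)
    weighted = sum(len(part) / n * entropy(part) for part in child_partitions)
    return entropy(parent_labels) - weighted

parent = ["yes"] * 6 + ["no"] * 4
children = [["yes"] * 5 + ["no"], ["yes"] + ["no"] * 3]
print(entropy_gain(parent, children))  # roughly 0.26 bits for this toy split
```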


Journal:

Volume:   Issue:

Pages: -

Publication date: 2008